5 research outputs found

    Software Traceability using Latent Semantic Analysis and Relevance Feedback

    Software traceability (ST), in its broadest sense, is the process of tracking changes across the corpus of documents created throughout the software development life-cycle. Traditional ST approaches, however, require substantial human effort to identify and consistently record inter-dependencies among software artifacts. In this paper we present an approach that recovers traceability links automatically using the information retrieval (IR) techniques of Latent Semantic Analysis (LSA) and relevance feedback, and we present a software tool that implements these ideas. We discuss in detail how software artifacts can be represented in a vector space model and how term extraction and weighting can be accomplished for UML artifacts, such as use-cases, interaction and state diagrams, as well as for source code and natural-language text documents. We also explain how the structural information inherent in software artifacts can be preserved in the term extraction and weighting phase of creating traceable artifacts. Unlike other tools, ours incorporates human knowledge into the traceability link recovery process through relevance feedback, with the aim of improving the quality of the recovered links. Finally, we demonstrate the effectiveness of our tool-based approach through a case study of a pilot software project and compare our results with those of a manual tracing process.
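    The vector-space idea behind LSA-based link recovery can be sketched in a few lines. This is a minimal illustration on a toy corpus, not the paper's tool: the artifact names, term rows, and rank-k choice are all invented for the example. Artifacts become columns of a term-document matrix, a truncated SVD projects them into a low-rank concept space, and cosine similarity there suggests candidate trace links.

```python
import numpy as np

# Toy term-document matrix: rows = terms, columns = software artifacts
# (e.g. a use-case, a class, a test). Entries are term frequencies.
artifacts = ["use_case_login", "class_Authenticator", "test_checkout"]
A = np.array([
    [2, 1, 0],   # term "login"
    [1, 2, 0],   # term "password"
    [0, 0, 3],   # term "cart"
], dtype=float)

# LSA: truncated SVD projects artifacts into a rank-k concept space.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
docs_k = (np.diag(s[:k]) @ Vt[:k]).T   # one row per artifact, in concept space

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Candidate trace link: the first two artifacts share vocabulary, so they
# land close together in concept space, while the third does not.
print(cosine(docs_k[0], docs_k[1]))   # high similarity: likely trace link
print(cosine(docs_k[0], docs_k[2]))   # near zero: no link suggested
```

    Relevance feedback would then adjust a query vector in this same space based on which suggested links the analyst accepts or rejects.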

    Automated Retrieval of Artifacts Created during the Software Development Life-cycle

    Many software projects fail to meet their originally intended requirements. While this is sometimes because users and developers do not share the same vocabulary, it is more often due to changes that go unreported or unrecorded somewhere along the development cycle. Software traceability (ST) is the process of tracking changes across the corpus of documents created throughout the software development life-cycle. Known techniques, such as traceability matrices, attempt to solve the problem, but such mechanical methods are not only manually intensive: they entirely ignore the effects of synonymy and polysemy. Latent semantic analysis (LSA), widely used in the field of Information Retrieval (IR), is designed to mitigate exactly these effects. In this report we apply LSA to maintaining the artifacts generated during the software development life-cycle, placing greater emphasis than hitherto found in the literature on term extraction in software code, something we call attribute weighting. We also present a software tool that automates the traceability process, including query refinement, and show that the technique allows one to trace through the artifact corpus with confidence that the set of artifacts affected by a change will be discovered.
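    Query refinement of the kind mentioned above is commonly done with Rocchio-style relevance feedback. The sketch below is illustrative only: the report does not specify its refinement formula, and the alpha/beta/gamma weights are the classic textbook defaults, not values taken from the tool. The query vector is pulled toward artifacts the analyst judged relevant and pushed away from those judged non-relevant.

```python
import numpy as np

def rocchio(query, relevant, nonrelevant, alpha=1.0, beta=0.75, gamma=0.15):
    """Refine a query vector from relevance judgements (Rocchio's method)."""
    q = alpha * query
    if len(relevant):
        q = q + beta * np.mean(relevant, axis=0)
    if len(nonrelevant):
        q = q - gamma * np.mean(nonrelevant, axis=0)
    return np.clip(q, 0.0, None)   # negative term weights are usually dropped

query  = np.array([1.0, 0.0, 0.0])      # analyst's initial query
rel    = [np.array([1.0, 1.0, 0.0])]    # artifact judged a true trace link
nonrel = [np.array([0.0, 0.0, 1.0])]    # artifact judged a false positive
refined = rocchio(query, rel, nonrel)
print(refined)   # weight grows on terms shared with relevant artifacts
```

    Iterating this loop is what lets human knowledge steer the automated retrieval toward the artifacts actually affected by a change.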

    Modelling Quality of Service in IEEE 802.16 Networks

    Although only relatively recently standardized, IEEE 802.16 or WiMAX networks are receiving a great deal of attention in both industry and research. This is because, with the increased emphasis on multimedia data, 802.16 promises, beyond the general advantages of wireless, wider bandwidth and QoS as part of the standard. It is also well suited as a backhaul network for other networks, in particular 802.11a/b/g/e or WiFi. As with any new technology, many questions remain open, of which transmission scheduling and Connection Admission Control (CAC) are the most prominent; the standard intentionally specifies neither function. Unlike other performance models we have seen, we consider an analytical framework which takes into account the close relationship between the CAC and scheduler algorithms and which is applicable to each mode of operation and admission control paradigm specified by the standard. The long-term objective of this work is to present a hybrid analytic and simulation model, based on the proposed framework, for modelling QoS metrics in 802.16 networks.
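    The coupling between CAC and the scheduler can be illustrated with the simplest bandwidth-based admission rule: admit a connection only if the scheduler can still honour the minimum reserved rates of everything already admitted. This sketch is purely illustrative, with an invented capacity figure and no attempt to model the 802.16 service classes or any specific algorithm from the framework.

```python
CAPACITY_KBPS = 10_000   # illustrative total capacity the scheduler can allocate

class AdmissionController:
    """Toy bandwidth-based CAC: the admission decision is tied directly to
    what the scheduler could still guarantee, mirroring the close CAC /
    scheduler relationship the framework models."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.reserved = 0.0

    def admit(self, min_reserved_rate):
        # Admit only if the connection's minimum reserved rate still fits;
        # otherwise already-admitted QoS guarantees would be violated.
        if self.reserved + min_reserved_rate <= self.capacity:
            self.reserved += min_reserved_rate
            return True
        return False

cac = AdmissionController(CAPACITY_KBPS)
decisions = [cac.admit(rate) for rate in (4000, 4000, 4000)]
print(decisions)   # the third request exceeds the remaining capacity
```

    A real 802.16 model must additionally distinguish the standard's scheduling service types and the chosen mode of operation, which is precisely the generality the proposed framework targets.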

    A Probabilistic Dynamic Technique for the Distributed Generation of Very Large State Spaces

    Conventional methods for state space exploration are limited to the analysis of small systems because they suffer from excessive memory and computational requirements. We have developed a new dynamic probabilistic state exploration algorithm which addresses this problem for general, structurally unrestricted state spaces. Our method has a low state omission probability and low memory usage that is independent of the length of the state vector. In addition, the algorithm is easily parallelised. This combination of probability and parallelism enables us to rapidly explore state spaces an order of magnitude larger than those obtainable using conventional exhaustive techniques. We derive a performance model of this new algorithm in order to quantify its benefits in terms of distributed run-time, speedup and efficiency. We implement our technique on a distributed-memory parallel computer and demonstrate results which compare favourably with the performance model. Finally, we discuss suitable choices for the three hash functions on which our algorithm is based.
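    The core trade-off, memory independent of the state-vector length at the price of a small omission probability, can be sketched with hash compaction: store only a fixed-width signature of each visited state, so two distinct states that collide cause one of them to be (wrongly) skipped. This is a single-machine sketch under invented assumptions: the toy transition system and the SHA-256 signature are illustrative, not the paper's model or its three hash functions.

```python
import hashlib
from collections import deque

def state_hash(state, bits=64):
    """Compress a state vector into a fixed-width signature. Memory per
    visited state is now `bits`, regardless of the state vector's length."""
    digest = hashlib.sha256(repr(state).encode()).digest()
    return int.from_bytes(digest[: bits // 8], "big")

def successors(state):
    # Toy transition system: a pair of counters, each modulo 4.
    a, b = state
    return [((a + 1) % 4, b), (a, (b + 1) % 4)]

def explore(initial):
    """Breadth-first exploration storing signatures only, never full states."""
    seen = {state_hash(initial)}
    frontier = deque([initial])
    count = 1
    while frontier:
        for succ in successors(frontier.popleft()):
            h = state_hash(succ)
            if h not in seen:   # a collision here would omit a real state
                seen.add(h)
                frontier.append(succ)
                count += 1
    return count

print(explore((0, 0)))   # 16 reachable states in the 4x4 toy system
```

    Because `seen` partitions naturally by hash value, the table can be split across processors, which is what makes the approach easy to parallelise on a distributed-memory machine.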

    Automation of RAID Controller Operation Using ROSTI

    Describes a technique to automate the design and implementation of RAID controller logic using a formalism for RAID protection schemes.